Enable opting out of TSVI #1128
Benchmark Report for Commit 6a0ecfa
Codecov Report

@@            Coverage Diff             @@
##           breaking    #1128    +/- ##
============================================
+ Coverage     81.32%   81.69%    +0.36%
============================================
  Files            40       42        +2
  Lines          3807     3922      +115
============================================
+ Hits           3096     3204      +108
- Misses          711      718        +7
DynamicPPL.jl documentation for PR #1128 is available at:
* Make InitContext work with OnlyAccsVarInfo
* Do not convert NamedTuple to Dict
* Remove logging
* Enable InitFromPrior and InitFromUniform too
* Fix `infer_nested_eltype` invocation
The performance gains in #1132 are only really significant when TSVI is not used, but TSVI currently gets used more often than it should (#1086).
This PR provides a mechanism to globally opt out of TSVI in a way that is fully backwards-compatible.
Changing this to a global opt-in would only require changing the default to false. A per-model opt-in would also be doable, albeit requiring more work; I think it's better to start with a global switch and only consider a per-model one if people complain.
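For illustration, here is a minimal sketch of what such a global opt-out toggle could look like. Apart from `USE_THREADSAFE_EVAL` and `Threads.nthreads()`, which are mentioned in this PR, the helper names below are hypothetical and not the actual DynamicPPL API:

```julia
# Minimal sketch of a global opt-out flag, not the actual DynamicPPL code.
# `USE_THREADSAFE_EVAL` is the flag name referenced in this PR; the helper
# functions around it are hypothetical.

# Defaults to `true`, so existing behaviour is unchanged (i.e. opt-out).
const USE_THREADSAFE_EVAL = Ref{Bool}(true)

# Hypothetical setter for turning thread-safe evaluation on or off globally.
set_threadsafe_eval!(flag::Bool) = (USE_THREADSAFE_EVAL[] = flag)

# Evaluation would only wrap the varinfo in a thread-safe one when the flag
# is set and more than one thread is available.
use_threadsafe_eval() = USE_THREADSAFE_EVAL[] && Threads.nthreads() > 1
```

With a flag like this, calling something like `set_threadsafe_eval!(false)` opts the whole session out of TSVI, the default of `true` keeps things backwards-compatible, and flipping the default to `false` is exactly what would turn this into the opt-in variant described above.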
Performance
Edit: while benchmarking this, I just realised that dereferencing USE_THREADSAFE_EVAL gives rise to more overhead than checking Threads.nthreads(). A per-model flag should work better here because, if it's stored in the model's type, the check can be removed at compile time. I'll investigate this further.
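A rough sketch of why a type-level, per-model flag avoids that overhead (the `ThreadSafeFlaggedModel` wrapper and its constructor are hypothetical, purely for illustration):

```julia
# Hypothetical wrapper that carries the thread-safety choice in its type.
struct ThreadSafeFlaggedModel{threadsafe,M}
    model::M
end

# Convenience constructor that lifts the Bool into the type parameter.
ThreadSafeFlaggedModel(model; threadsafe::Bool=true) =
    ThreadSafeFlaggedModel{threadsafe,typeof(model)}(model)

# Because `threadsafe` is a type parameter, the compiler specialises this
# method per value: for `threadsafe = false` the check constant-folds to
# `false`, so no global flag has to be dereferenced at runtime.
use_threadsafe_eval(::ThreadSafeFlaggedModel{threadsafe}) where {threadsafe} =
    threadsafe && Threads.nthreads() > 1
```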